Data classification with multilayer perceptrons using a generalized error function
Authors
Abstract
The learning process of a multilayer perceptron requires the optimization of an error function E(y, t) that compares the predicted output, y, with the observed target, t. We review some common error functions, analyze their mathematical properties for data classification purposes, and introduce a new one, E_Exp, inspired by the Z-EDM algorithm that we recently proposed. An important property of E_Exp is its ability to emulate the behavior of other error functions through the adjustment of a single real-valued parameter. In other words, E_Exp is a generalized error function embodying complementary features of the other functions. The experimental results show that the flexibility of the new, generalized error function allows it to match the best results achievable with the other functions and, in some cases, to improve on them.
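The abstract does not reproduce the formula for E_Exp, so the sketch below is only an illustration, under assumed names, of how a single real-valued parameter (here called tau) can let an exponential-of-squared-error cost emulate a conventional one such as the mean-squared error: for large tau the exponential is nearly linear in the squared error, so the cost behaves like MSE up to a constant, while a small positive tau strongly penalizes the worst-classified patterns.

```python
import numpy as np

def mse(y, t):
    # Conventional mean-squared error, shown for comparison.
    return np.mean(np.sum((y - t) ** 2, axis=1))

def exp_error(y, t, tau=10.0):
    # Hypothetical exponential-of-squared-error cost controlled by tau
    # (an assumed parameterization, not necessarily the paper's E_Exp).
    # For large tau, exp(e / tau) ~ 1 + e / tau, so the cost is roughly
    # tau + MSE and yields MSE-like gradients; a small positive tau
    # instead amplifies the largest per-pattern errors.
    per_pattern = np.sum((y - t) ** 2, axis=1)
    return tau * np.mean(np.exp(per_pattern / tau))

# Toy check: network outputs y versus one-hot targets t for a two-class task.
y = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
t = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
print(mse(y, t))                   # plain MSE
print(exp_error(y, t, tau=100.0))  # ~ tau + MSE for large tau
print(exp_error(y, t, tau=0.5))    # dominated by the worst-classified pattern
```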
Similar resources
Support Vector Machine Based Facies Classification Using Seismic Attributes in an Oil Field of Iran
Seismic facies analysis (SFA) aims to classify similar seismic traces based on amplitude, phase, frequency, and other seismic attributes. SFA has proven useful in interpreting seismic data, allowing significant information on subsurface geological structures to be extracted. While facies analysis has been widely investigated through unsupervised-classification-based studies, there are few cases...
Enlarging Training Sets for Neural Networks
A study is presented to compare the performance of multilayer perceptrons, radial basis function networks, and probabilistic neural networks for classification. In many classification problems, probabilistic neural networks have outperformed other neural classifiers. Unfortunately, with this kind of network, the number of required operations to classify one pattern directly depends on the numb...
Error back-propagation algorithm for classification of imbalanced data
Classification of imbalanced data is pervasive but it is a difficult problem to solve. In order to improve the classification of imbalanced data, this letter proposes a new error function for the error backpropagation algorithm of multilayer perceptrons. The error function intensifies weight-updating for the minority class and weakens weight-updating for the majority class. We verify the effect...
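The summary above only states the idea of strengthening minority-class updates; the cited letter's exact error function is not given here. Purely as an illustration of that idea, a minimal class-weighted squared error, where minority_mask, w_min, and w_maj are hypothetical names and values:

```python
import numpy as np

def weighted_squared_error(y, t, minority_mask, w_min=5.0, w_maj=0.5):
    # Illustrative class-weighted cost: minority-class patterns
    # (minority_mask == True) get a larger weight, so their gradients
    # dominate the back-propagated weight updates, while majority-class
    # patterns are damped. w_min and w_maj are assumed, tunable values.
    per_pattern = np.sum((y - t) ** 2, axis=1)
    weights = np.where(minority_mask, w_min, w_maj)
    return np.mean(weights * per_pattern)

# Example: the third pattern belongs to the minority class.
y = np.array([[0.8, 0.2], [0.7, 0.3], [0.4, 0.6]])
t = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(weighted_squared_error(y, t, minority_mask=np.array([False, False, True])))
```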
neuralnet: Training of Neural Networks
Artificial neural networks are applied in many situations. neuralnet is built to train multi-layer perceptrons in the context of regression analyses, i.e. to approximate functional relationships between covariates and response variables. Thus, neural networks are used as extensions of generalized linear models. neuralnet is a very flexible package. The backpropagation algorithm and three versio...
The Effect of Training Set Size for the Performance of Neural Networks of Classification
Even though multilayer perceptrons and radial basis function networks belong to the class of artificial neural networks and they are used for similar tasks, they have very different structures and training mechanisms. So, some researchers showed better performance with radial basis function networks, while others showed some different results with multilayer perceptrons. This paper compares the...
Journal: Neural networks : the official journal of the International Neural Network Society
Volume: 21, Issue: 9
Pages: -
Publication date: 2008